-
Minimally invasive surgeries can benefit from miniaturized sensors on surgical graspers that provide additional information to the surgeon. In this work, a 6 mm ultrasound transducer was added to a surgical grasper to measure acoustic properties of the grasped tissue. However, the ultrasound sensor has a ringing artifact arising from the decaying oscillation of its piezo element, and at short travel distances the artifact blends with the acoustic echo. Without a method to remove the artifact from the blended signal, it is impossible to measure one of the main characteristics of an ultrasound waveform: its time of flight. In this paper, six filtering methods for clearing the artifact from the ultrasound waveform were compared: a bandpass filter, an adaptive least mean squares (LMS) filter, spectrum suppression (SPS), a recurrent neural network (RNN), a long short-term memory network (LSTM), and a gated recurrent unit (GRU). Following each filtering method, four time-of-flight extraction methods were compared: magnitude threshold, envelope peak detection, cross-correlation, and short-time Fourier transform (STFT). The RNN paired with cross-correlation proved optimal for this task, achieving a root mean square error of 3.6%.
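As an illustration of the best-performing extraction step, the sketch below shows how cross-correlation against a known reference pulse can yield a time of flight from an already-filtered trace. It is a minimal Python example, not the paper's implementation: the 5 MHz pulse shape, trace length, and the `time_of_flight_xcorr` helper are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import correlate

def time_of_flight_xcorr(filtered_signal, reference_pulse, fs):
    """Estimate time of flight by locating the reference pulse
    inside the (artifact-free) filtered signal via cross-correlation."""
    # Full cross-correlation between the echo trace and the known pulse shape.
    xcorr = correlate(filtered_signal, reference_pulse, mode="full")
    # Lag (in samples) at which the correlation peaks; shift back by
    # len(reference_pulse) - 1 to convert "full" indices to signal lags.
    lag = np.argmax(np.abs(xcorr)) - (len(reference_pulse) - 1)
    return lag / fs  # seconds

# Synthetic example: a 5 MHz windowed pulse delayed by 27 us in a 500 MHz trace.
fs = 500e6
t = np.arange(0, 2e-6, 1 / fs)
pulse = np.sin(2 * np.pi * 5e6 * t) * np.hanning(t.size)
trace = np.zeros(int(60e-6 * fs))
delay = int(27e-6 * fs)
trace[delay:delay + pulse.size] = pulse
print(time_of_flight_xcorr(trace, pulse, fs))  # ~2.7e-05 s
```

Cross-correlation integrates evidence over the entire pulse rather than relying on a single threshold crossing, which may help explain why it pairs well with the RNN filter in the reported comparison.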
-
Minimally invasive surgeries can benefit from miniaturized sensors on surgical graspers that provide additional information to the surgeon. One such potential sensor is an ultrasound transducer. At long travel distances, the ultrasound transducer can accurately measure its ultrasound wave's time of flight and, from it, classify the grasped tissue. However, the ultrasound transducer has a ringing artifact arising from the decaying oscillation of its piezo element, and at short travel distances the artifact blends with the acoustic echo. Without a method to remove the artifact from the blended signal, it is impossible to measure the waveform's time of flight.

Both classical signal processing and deep learning methods can be used to filter raw ultrasound signals, removing the ringing artifact, and then to obtain the time of flight from the filtered signals. This record provides two datasets to train and test algorithms developed for filtering out the ringdown artifact and subsequently extracting the waveform's time of flight. All measured (raw) signals were collected using the same experimental setup: an oscilloscope connected to an ultrasound driver driving a transducer attached to a container of liquid water, mimicking tissue properties in a tightly controlled environment.

The training dataset consists of two groups of signal pairs. The first group consists of 993 signal pairs, each pair consisting of a raw ultrasound signal (the acoustic echo blended with the ringing artifact) and a target filtered signal (only the desired echo). Signals in the first group are sampled at the original sampling frequency of 500 MHz. The second group is like the first, but with all signals downsampled by a factor of 26. The training dataset covers only travel distances from 2 cm to 4 cm, inclusive, because at these distances in water the echo is sufficiently separated from the ringdown artifact to be manually extractable. The signal pairs are approximately equally distributed across the covered distances.

The test dataset similarly consists of two groups of raw ultrasound signals. The first group consists of 270 signals collected at 9 travel distances between 0.5 cm and 4.0 cm, with 30 signals per distance, along with the associated true time of flight for each distance. Signals in the first group are sampled at the original sampling frequency of 500 MHz. The second group is like the first, but with all signals downsampled by a factor of 26. All signals in both datasets are aligned.
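The sketch below shows one plausible way to reproduce the factor-26 downsampling and to compute the expected time of flight for a given travel distance. The dataset description does not specify the exact resampling method, water temperature, or whether the quoted distance is one way, so the anti-aliased `decimate` call, the 1480 m/s speed of sound, and the round-trip assumption are all assumptions.

```python
import numpy as np
from scipy.signal import decimate

FS_RAW = 500e6    # original sampling rate from the dataset description (Hz)
DECIMATION = 26   # downsampling factor used for the second signal group
C_WATER = 1480.0  # approximate speed of sound in water near room temp (m/s)

def downsample(raw_signal):
    """Reproduce the factor-26 downsampling as anti-aliased decimation.
    The dataset's exact resampling method is not specified; decimate()
    with an FIR anti-aliasing filter is one reasonable choice."""
    return decimate(np.asarray(raw_signal, dtype=float), DECIMATION, ftype="fir")

def expected_tof(distance_m):
    """Pulse-echo time of flight, assuming the quoted travel distance is
    one way: the wave crosses it twice before reaching the transducer."""
    return 2.0 * distance_m / C_WATER

# e.g. a 2 cm travel distance gives roughly a 27-microsecond round trip
print(expected_tof(0.02))  # ~2.7e-05 s
```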
-
Minimally invasive surgery lacks the tactile feedback that surgeons find useful for locating and diagnosing tissue abnormalities. The goal of this paper is to calibrate the sensors of a motorized Smart Grasper surgical instrument to provide accurate force and position measurements. With the novel calibration hardware, these values serve two functions. The first is to control the Grasper's motor to prevent tissue damage. The second is to act as the base on which future work in multi-modal sensor-fusion tissue characterization can be built. Our results show that the Grasper jaw distance is a function of both applied force and motor angle, while the force the jaws apply to the tissue can be measured using the internal load cell. All code and data sets used to generate this paper can be found on GitHub at https://github.com/Yana-Sosnovskaya/Smart_Grasper_public.
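Since the abstract states that jaw distance depends on both applied force and motor angle, one natural calibration model is a low-order polynomial surface fit by least squares. The sketch below is illustrative only: the basis terms, units, and synthetic calibration sweep are assumptions, not the paper's actual model or data.

```python
import numpy as np

def basis(force, angle):
    """Second-order polynomial basis in applied force F and motor angle theta."""
    F = np.atleast_1d(np.asarray(force, dtype=float))
    th = np.atleast_1d(np.asarray(angle, dtype=float))
    return np.column_stack([np.ones_like(F), F, th, F * th, F**2, th**2])

def fit_jaw_distance(force, angle, distance):
    """Least-squares coefficients c for jaw distance d ~ basis(F, theta) @ c."""
    coeffs, *_ = np.linalg.lstsq(basis(force, angle), distance, rcond=None)
    return coeffs

# Hypothetical calibration sweep: forces in N, motor angles in rad,
# jaw openings in mm -- stand-ins for the instrument's real measurements.
rng = np.random.default_rng(0)
F = rng.uniform(0.0, 10.0, 200)
th = rng.uniform(0.0, 1.5, 200)
d = 12.0 - 6.0 * th - 0.15 * F + 0.05 * F * th + rng.normal(0.0, 0.02, 200)
c = fit_jaw_distance(F, th, d)
print(basis(5.0, 0.8) @ c)  # predicted jaw opening at F = 5 N, theta = 0.8 rad
```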
-
Laparoscopic surgery presents practical benefits over traditional open surgery, including reduced risk of infection, less discomfort, and shorter recovery times for patients. Introducing robotic systems into surgical tasks provides additional enhancements, including improved precision, remote operation, and an intelligent software layer capable of filtering aberrant motion and scaling surgical maneuvers. However, the software interface in telesurgery also lends itself to potential adversarial cyber attacks. Such attacks can negatively affect both the surgeon's motion commands and the sensory information relayed to the operator. To combat cyber attacks on the latter, one method of enhancing surgeon feedback through multiple sensory pathways is to incorporate reliable, complementary forms of information across different sensory modes. Built-in partial redundancies or inferences between perceptual channels, or perception complementarities, can be used both to detect and to recover from compromised operator feedback. In surgery, haptic sensations are extremely useful for preventing undue and unwanted tissue damage from excessive tool-tissue force. Direct force sensing is not yet deployable due to the sterilization requirements of the operating room. Instead, combinations of other sensing methods may be relied upon, such as non-contact model-based force estimation. This paper presents the design of surgical simulator software that can be used for vision-based non-contact force sensing to inform the perception complementarity of vision and force feedback for telesurgery. A brief user study verifies the efficacy of graphical force feedback from vision-based force estimation and suggests that vision may effectively complement direct force sensing.
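One simple way to exploit perception complementarity for attack detection is to compare redundant force channels and flag samples where they diverge. The sketch below is a minimal illustration, assuming a vision-based force estimate and a second channel (e.g., commanded or haptic force) sampled on a common time base; the tolerance, smoothing window, and injected attack are all assumptions, not the paper's method.

```python
import numpy as np

def flag_compromised_feedback(vision_force, other_force, abs_tol=0.5, window=50):
    """Flag samples where two redundant force channels disagree.
    A rolling mean smooths sensor noise before comparison; samples whose
    smoothed residual exceeds abs_tol (in N) are marked as suspect."""
    residual = np.abs(np.asarray(vision_force) - np.asarray(other_force))
    kernel = np.ones(window) / window
    smoothed = np.convolve(residual, kernel, mode="same")
    return smoothed > abs_tol

# Hypothetical streams: the second channel is spoofed after sample 600.
t = np.arange(1000)
true_force = 2.0 + np.sin(t / 40.0)
vision = true_force + np.random.default_rng(1).normal(0.0, 0.1, t.size)
spoofed = true_force.copy()
spoofed[600:] += 1.5  # injected offset, e.g. a falsified force reading
print(np.where(flag_compromised_feedback(vision, spoofed))[0][:5])
```

The detection threshold trades false alarms against latency: a tighter tolerance catches smaller injected offsets but is more easily triggered by honest sensor noise.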
-
Surgical robots have been introduced to operating rooms over the past few decades due to their high sensitivity, small size, and remote controllability. The cable-driven nature of many surgical robots allows the systems to be dexterous and lightweight, with diameters as low as 5 mm. However, due to the slack and stretch of the cables and the backlash of the gears, inevitable uncertainties enter the kinematics calculation [1]. Since the reported end-effector position of surgical robots like the RAVEN-II [2] is calculated directly from the motor encoder measurements and forward kinematics, it may contain errors as large as 10 mm, whereas the semi-autonomous functions being introduced into abdominal surgeries require a position inaccuracy of at most 1 mm. To resolve the problem, a cost-effective, real-time, data-driven pipeline for estimating robot end-effector position is proposed and tested on the RAVEN-II. Analysis shows an improved end-effector position error of around 1 mm RMS across the entire robot workspace, without a high-resolution motion tracker. The open-source code, data sets, videos, and user guide can be found at https://github.com/HaonanPeng/RAVEN_Neural_Network_Estimator.
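A minimal sketch of such a data-driven corrector is shown below: a small multilayer perceptron learns the residual between the forward-kinematics position and tracker ground truth. The architecture, input set, and training tensors here are assumptions for illustration, not the network from the linked repository.

```python
import torch
from torch import nn

class FKCorrector(nn.Module):
    """Illustrative corrector: maps the forward-kinematics (FK) position
    plus joint encoder readings to a corrected end-effector position by
    predicting a residual. Layer sizes and inputs are assumptions."""
    def __init__(self, n_joints=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + n_joints, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 3),            # predicted xyz position residual
        )

    def forward(self, fk_pos, joints):
        x = torch.cat([fk_pos, joints], dim=-1)
        return fk_pos + self.net(x)      # corrected end-effector position

# Training against motion-capture ground truth (tensors are stand-ins).
model = FKCorrector()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
fk_pos = torch.randn(256, 3)                    # FK positions (m)
joints = torch.randn(256, 7)                    # joint encoder values
gt_pos = fk_pos + 0.005 * torch.randn(256, 3)   # tracker ground truth
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(fk_pos, joints), gt_pos)
    loss.backward()
    opt.step()
```

Learning a residual rather than the absolute position keeps the network's job small: it only has to model the cable slack, stretch, and backlash effects that the analytic kinematics miss.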